Gradient-Based Adaptive Stochastic Search for Non-Differentiable Optimization


Related articles

Distributed Stochastic Optimization via Adaptive Stochastic Gradient Descent

Stochastic convex optimization algorithms are the most popular way to train machine learning models on large-scale data. Scaling up the training process of these models is crucial in many applications, but the most popular algorithm, Stochastic Gradient Descent (SGD), is a serial algorithm that is surprisingly hard to parallelize. In this paper, we propose an efficient distributed stochastic op...
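The serial bottleneck the abstract mentions is visible in a minimal SGD sketch: each update depends on the previous iterate, so the loop cannot be naively split across workers. This is an illustrative sketch only (the function names and the toy objective are not from the paper):

```python
import numpy as np

def sgd(grad, x0, lr=0.01, n_steps=1000, rng=None):
    """Minimal serial SGD: each step reads the iterate produced by the
    previous step, which is what makes naive parallelization hard."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - lr * grad(x, rng)  # stochastic gradient at the current iterate
    return x

# Toy example: minimize E[(x - 3)^2] from noisy gradient evaluations.
g = lambda x, rng: 2 * (x - 3) + rng.normal(scale=0.1, size=x.shape)
x_star = sgd(g, np.array([0.0]))
```

Distributed variants such as the one proposed in the paper have to break or approximate exactly this step-to-step dependence.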


Adaptive Stochastic Conjugate Gradient Optimization for Temporal Medical Image Registration

We propose an Adaptive Stochastic Conjugate Gradient (ASCG) optimization algorithm for temporal medical image registration. This method combines the advantages of Conjugate Gradient (CG) method and Adaptive Stochastic Gradient Descent (ASGD) method. The main idea is that the search direction of ASGD is replaced by stochastic approximations of the conjugate gradient of the cost function. In addi...


Adaptive search with stochastic acceptance probabilities for global optimization

We present an extension of continuous domain Simulated Annealing. Our algorithm employs a globally reaching candidate generator, adaptive stochastic acceptance probabilities, and converges in probability to the optimal value. An application to simulation-optimization problems with asymptotically diminishing errors is presented. Numerical results on a noisy protein-folding problem are included.
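The generic mechanism behind stochastic acceptance probabilities can be sketched with a standard Metropolis-style annealing loop; the paper's adaptive acceptance rule and globally reaching candidate generator are more elaborate, so this is only an illustration of the basic idea (all names and the toy objective are assumptions, not from the paper):

```python
import math
import random

def anneal(f, x0, neighbor, temps, rng=None):
    """Simulated annealing with the classical acceptance probability
    exp(-(f(y) - f(x)) / T): downhill moves are always accepted,
    uphill moves with a probability that shrinks as T cools."""
    rng = rng or random.Random(0)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for T in temps:
        y = neighbor(x, rng)
        fy = f(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Toy example: a 1-D multimodal objective with a geometric cooling schedule.
f = lambda x: x * x + 10 * math.sin(x)
nb = lambda x, rng: x + rng.uniform(-1.0, 1.0)
temps = [10 * 0.99 ** k for k in range(2000)]
xb, fb = anneal(f, 5.0, nb, temps)
```

The occasional uphill acceptance is what lets the chain escape the local minimum near the starting point and reach lower basins.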


A Model Reference Adaptive Search Method for Stochastic Global Optimization

We propose a new method called Stochastic Model Reference Adaptive Search (SMRAS) for finding a global optimal solution to a stochastic optimization problem in situations where the objective functions cannot be evaluated exactly, but can be estimated with some noise (or uncertainty), e.g., via simulation. SMRAS is a generalization of the recently proposed Model Reference Adaptive Search (MRAS) ...


An adaptive gradient sampling algorithm for non-smooth optimization

We present an algorithm for the minimization of f : R^n → R, assumed to be locally Lipschitz and continuously differentiable in an open dense subset D of R^n. The objective f may be nonsmooth and/or nonconvex. The method is based on the gradient sampling algorithm (GS) of Burke, Lewis, and Overton [SIAM J. Optim., 15 (2005), pp. 751-779]. It differs, however, from previously proposed versions of ...



Journal

Journal title: IEEE Transactions on Automatic Control

Year: 2014

ISSN: 0018-9286, 1558-2523

DOI: 10.1109/tac.2014.2310052